Efficient Multiplicative Updates for Support Vector Machines
Authors
Abstract
The dual formulation of the support vector machine (SVM) objective is an instance of a nonnegative quadratic programming problem. We reformulate the SVM objective as a matrix factorization problem, which establishes a connection with the regularized nonnegative matrix factorization (NMF) problem. This allows us to derive a novel multiplicative algorithm for solving hard- and soft-margin SVMs. The algorithm follows as a natural extension of the updates for NMF and semi-NMF, and no additional parameter setting, such as choosing a learning rate, is required. Exploiting the connection between the SVM and NMF formulations, we show how NMF algorithms can be applied to the SVM problem; the multiplicative updates we derive for the SVM problem also constitute novel updates for semi-NMF. Further, this unified view yields algorithmic insights in both directions: we demonstrate that the Kernel Adatron algorithm for solving SVMs can be adapted to NMF problems. Experiments demonstrate rapid convergence to good classifiers. We analyze the asymptotic convergence rates of the updates and establish tight bounds. We test the updates on several datasets with various kernels and report classification performance equivalent to that of a standard SVM.
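To make the flavor of such updates concrete, the following is a minimal NumPy sketch of a multiplicative update of this general family applied to the SVM dual, written as the nonnegative quadratic program min_a 0.5 aᵀAa − 1ᵀa with A_ij = y_i y_j K(x_i, x_j). The function name, the clipping treatment of the soft-margin box constraint, and the fixed iteration count are illustrative assumptions; the paper's own updates and their derivation may differ in detail.

```python
import numpy as np

def multiplicative_svm(K, y, C=None, n_iter=500, eps=1e-12):
    """Sketch of a multiplicative update for the SVM dual, viewed as
    nonnegative quadratic programming:
        min_a  0.5 a^T A a - 1^T a,  a >= 0  (and a <= C for soft margin),
    with A_ij = y_i y_j K_ij. The update form assumed here is the standard
    one for this family (split A into positive/negative parts), not
    necessarily the exact update derived in the paper.
    """
    y = y.astype(float)
    A = np.outer(y, y) * K              # dual Hessian
    Ap = np.maximum(A, 0.0)             # elementwise positive part A+
    Am = np.maximum(-A, 0.0)            # elementwise negative part A-
    a = np.ones(len(y))                 # strictly positive initialization
    for _ in range(n_iter):
        p = Ap @ a + eps                # (A+ a)_i, guarded against division by zero
        m = Am @ a                      # (A- a)_i
        # a_i <- a_i * (1 + sqrt(1 + 4 p_i m_i)) / (2 p_i);
        # the constant 1 under the root comes from the linear term b_i = -1
        a *= (1.0 + np.sqrt(1.0 + 4.0 * p * m)) / (2.0 * p)
        if C is not None:
            # soft margin: simple clipping onto the box [0, C]
            # (an assumed heuristic, not necessarily the paper's treatment)
            a = np.minimum(a, C)
    return a
```

For a linear kernel one would call this as a = multiplicative_svm(X @ X.T, y) with labels y in {−1, +1}; under this sketch the returned vector plays the role of the dual variables, and the primal weight vector can be recovered as w = X.T @ (a * y).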
Similar resources
Multiplicative Updates for Nonnegative Quadratic Programming in Support Vector Machines
We derive multiplicative updates for solving the nonnegative quadratic programming problem in support vector machines (SVMs). The updates have a simple closed form, and we prove that they converge monotonically to the solution of the maximum margin hyperplane. The updates optimize the traditionally proposed objective function for SVMs. They do not involve any heuristics such as choosing a learn...
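For context, the simple closed form this reference describes is usually written, for minimizing F(v) = ½vᵀAv + bᵀv over v ≥ 0 with A split elementwise into positive and negative parts A = A⁺ − A⁻, as the update below; this is a sketch of that standard form rather than a quotation of the paper.

```latex
\[
  v_i \;\leftarrow\; v_i\,
  \frac{-b_i + \sqrt{\,b_i^{2} + 4\,(A^{+}v)_i\,(A^{-}v)_i\,}}
       {2\,(A^{+}v)_i},
  \qquad
  A^{+}_{ij} = \max(A_{ij},0),\quad A^{-}_{ij} = \max(-A_{ij},0).
\]
```

In the SVM dual one has A_ij = y_i y_j K(x_i, x_j) and b_i = −1, which recovers the update used in the Python sketch above.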
Ensembles of Partially Trained SVMs with Multiplicative Updates
The training of support vector machines (SVM) involves a quadratic programming problem, which is often optimized by a complicated numerical solver. In this paper, we propose a much simpler approach based on multiplicative updates. This idea was first explored in [Cristianini et al., 1999], but its convergence is sensitive to a learning rate that has to be fixed manually. Moreover, the update ru...
Multiplicative updates For Non-Negative Kernel SVM
We present multiplicative updates for solving hard and soft margin support vector machines (SVM) with non-negative kernels. They follow as a natural extension of the updates for non-negative matrix factorization. No additional parameter setting, such as choosing a learning rate, is required. Experiments demonstrate rapid convergence to good classifiers. We analyze the rates of asymptotic converge...
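When the kernel matrix is elementwise nonnegative, as assumed in this reference, the positive and negative parts of the dual Hessian A_ij = y_i y_j K_ij reduce to grouping the Gram entries by label agreement; the split below is our inference from that construction, offered as a sketch.

```latex
\[
  A^{+}_{ij} =
    \begin{cases} K_{ij} & \text{if } y_i = y_j \\ 0 & \text{otherwise} \end{cases}
  \qquad
  A^{-}_{ij} =
    \begin{cases} K_{ij} & \text{if } y_i \neq y_j \\ 0 & \text{otherwise} \end{cases}
\]
```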
Mining Biological Repetitive Sequences Using Support Vector Machines and Fuzzy SVM
Structural repetitive subsequences are among the most important portions of biological sequences and play crucial roles in the corresponding sequence's fold and functionality. The biggest class of repetitive subsequences is "transposable elements", which has its own sub-classes depending on context structure. Much research has been performed to critically determine the structure and function of repetitiv...
Feature Selection for Classification using Transductive Support Vector Machines
Given unlabeled data in advance, transductive feature selection (TFS) aims to maximize classification accuracy on these particular unlabeled data by selecting a small set of relevant, less redundant features. Specifically, this paper introduces the use of Transductive Support Vector Machines (TSVMs) for feature selection. We study three inductive SVM-related feature selection methods: corre...